177 research outputs found

    Learning Analytics for 21st Century Competencies


    Design and implementation of a pedagogic intervention using writing analytics

    © 2017 Asia-Pacific Society for Computers in Education. All rights reserved. Academic writing is a key skill for higher education students, and one that is often challenging to learn. A promising approach to helping students develop this skill is the use of automated tools that provide formative feedback on writing. However, such tools are not widely adopted by students unless they are useful for discipline-related writing and embedded in the curriculum. This recognition motivates an increased emphasis in the field on aligning learning analytics applications with learning design, so that analytics-driven feedback is congruent with the pedagogy and assessment regime. This paper describes the design, implementation, and evaluation of a pedagogic intervention developed for law students to make use of an automated Academic Writing Analytics tool (AWA) to improve their academic writing. In exemplifying this pedagogically aligned learning analytics intervention, we describe the development of a learning analytics platform to support the pedagogic design, illustrating its potential through example analyses of data derived from the task.
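    The abstract does not describe AWA's internals, so the sketch below is only an assumed illustration of how a writing-analytics tool might surface rhetorical moves for formative feedback: a toy rule-based annotator that flags sentences containing common discourse markers. The marker lists, the function annotate_sentences and the sample essay are all hypothetical and are not part of AWA.

        import re

        # Illustrative marker phrases for a few rhetorical moves often highlighted
        # by writing-analytics tools (assumed rules, not AWA's actual ones).
        RHETORICAL_MARKERS = {
            "contrast": [r"\bhowever\b", r"\bin contrast\b", r"\bon the other hand\b"],
            "emphasis": [r"\bimportantly\b", r"\bnotably\b", r"\bcrucially\b"],
            "summary": [r"\bin conclusion\b", r"\bto summarise\b", r"\boverall\b"],
        }

        def annotate_sentences(text):
            """Return each sentence paired with the rhetorical moves it appears to signal."""
            sentences = re.split(r"(?<=[.!?])\s+", text.strip())
            annotated = []
            for sentence in sentences:
                moves = [
                    move
                    for move, patterns in RHETORICAL_MARKERS.items()
                    if any(re.search(p, sentence, re.IGNORECASE) for p in patterns)
                ]
                annotated.append((sentence, moves))
            return annotated

        if __name__ == "__main__":
            essay = ("The doctrine appears settled. However, recent cases suggest "
                     "otherwise. In conclusion, the precedent remains contested.")
            for sentence, moves in annotate_sentences(essay):
                print(moves or ["-"], sentence)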

    Hypotheses, evidence and relationships: The HypER approach for representing scientific knowledge claims

    Biological knowledge is increasingly represented as a collection of (entity-relationship-entity) triplets. These are queried, mined, appended to papers, and published. However, this representation ignores the argumentation contained within a paper and the relationships between hypotheses, claims and evidence put forth in the article. In this paper, we propose an alternative view of the research article as a network of 'hypotheses and evidence'. Our knowledge representation focuses on scientific discourse as a rhetorical activity, which leads to a different direction in the development of tools and processes for modeling this discourse. We propose to extract knowledge from the article to allow the construction of a system where a specific scientific claim is connected, through trails of meaningful relationships, to experimental evidence. We discuss some current efforts and future plans in this area.
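    To make the contrast concrete, here is a minimal sketch (a toy data model of my own, not the HypER implementation) of how a claim might be connected to experimental evidence through a named relationship, rather than stored as a bare entity-relationship-entity triplet. All class and field names are assumptions.

        from dataclasses import dataclass, field

        # A bare triplet, as in the entity-relationship-entity representation.
        @dataclass(frozen=True)
        class Triplet:
            subject: str
            relation: str
            obj: str

        # A toy hypothesis-and-evidence network: nodes are hypotheses, claims or
        # evidence; edges carry the rhetorical relationship asserted in the paper.
        @dataclass
        class Node:
            node_id: str
            kind: str      # "hypothesis", "claim", or "evidence"
            text: str

        @dataclass
        class Edge:
            source: str
            target: str
            relation: str  # e.g. "supports", "contradicts", "motivates"

        @dataclass
        class DiscourseGraph:
            nodes: dict = field(default_factory=dict)
            edges: list = field(default_factory=list)

            def add(self, node):
                self.nodes[node.node_id] = node

            def link(self, source, target, relation):
                self.edges.append(Edge(source, target, relation))

            def evidence_for(self, claim_id):
                """Follow 'supports' edges back from a claim to its evidence."""
                return [self.nodes[e.source] for e in self.edges
                        if e.target == claim_id and e.relation == "supports"]

        # Usage: the triplet records only a relation between entities, while the
        # graph ties the claim to the evidence put forth for it.
        t = Triplet("Protein X", "regulates", "pathway Y")
        g = DiscourseGraph()
        g.add(Node("c1", "claim", "Protein X regulates pathway Y"))
        g.add(Node("e1", "evidence", "Knockout of X reduced Y activity (Fig. 2)"))
        g.link("e1", "c1", "supports")
        print(t)
        print([n.text for n in g.evidence_for("c1")])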

    Evidence-Based Dialogue Maps as a research tool to evaluate the quality of school pupils' scientific argumentation

    This pilot study focuses on the potential of Evidence-based Dialogue Mapping as a participatory action research tool to investigate young teenagers' scientific argumentation. Evidence-based Dialogue Mapping is a technique for graphically representing an argumentative dialogue through Questions, Ideas, Pros, Cons and Data. Our research objective is to better understand the use of Compendium, a Dialogue Mapping software tool, as both (1) a learning strategy to scaffold school pupils' argumentation and (2) a method to investigate the quality of their argumentative essays. The participants were a science teacher-researcher, a knowledge mapping researcher and 20 pupils, aged 12-13, in a summer science course for "gifted and talented" children in the UK. This study draws on multiple data sources: a discussion forum, the science teacher-researcher's and pupils' Dialogue Maps, pupil essays, and reflective comments about the uses of mapping for writing. Through qualitative analysis of two case studies, we examine the role of Evidence-based Dialogue Maps as a mediating tool in scientific reasoning: as conceptual bridges for linking and making knowledge intelligible; as support for the linearisation task of generating a coherent document outline; as a reflective aid to rethinking reasoning in response to teacher feedback; and as a visual language for making arguments tangible via cartographic conventions.
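    Compendium's internal format is not described in the abstract; purely as an illustration, a dialogue map of the kind described (Questions, Ideas, Pros, Cons and Data) can be modelled as a typed tree, and the linearisation task then amounts to walking that tree into an outline. The class names and the example map below are hypothetical.

        from dataclasses import dataclass, field
        from enum import Enum

        class NodeType(Enum):
            QUESTION = "Question"
            IDEA = "Idea"
            PRO = "Pro"
            CON = "Con"
            DATA = "Data"

        @dataclass
        class MapNode:
            label: str
            node_type: NodeType
            children: list = field(default_factory=list)

        def outline(node, depth=0):
            """Linearise the map into an indented outline, one step towards
            generating a coherent document structure from the dialogue."""
            print("  " * depth + "[" + node.node_type.value + "] " + node.label)
            for child in node.children:
                outline(child, depth + 1)

        # Usage: a tiny evidence-based dialogue map.
        question = MapNode("Why do objects fall at the same rate?", NodeType.QUESTION)
        idea = MapNode("Acceleration is independent of mass", NodeType.IDEA)
        pro = MapNode("Feather and hammer fall together in a vacuum", NodeType.PRO)
        data = MapNode("Apollo 15 demonstration footage", NodeType.DATA)
        pro.children.append(data)
        idea.children.append(pro)
        question.children.append(idea)
        outline(question)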

    A comparative analysis of the skilled use of automated feedback tools through the lens of teacher feedback literacy

    Effective learning depends on effective feedback, which in turn requires a set of skills, dispositions and practices on the part of both students and teachers that have been termed feedback literacy. A previously published teacher feedback literacy competency framework has identified what teachers need in order to implement feedback well. While this framework refers in broad terms to the potential uses of educational technologies, it does not examine in detail the new possibilities of automated feedback (AF) tools, especially those that are open in the sense of offering varying degrees of transparency and control to teachers. Using analytics and artificial intelligence, open AF tools permit automated processing and feedback with a speed, precision and scale that exceed those of humans. This raises important questions about how human and machine feedback can be combined optimally and what is now required of teachers to use such tools skillfully. The paper addresses two research questions: which teacher feedback competencies are necessary for the skilled use of open AF tools, and what does the skilled use of open AF tools add to our conceptions of teacher feedback competencies? We analyse published evidence concerning teachers' use of open AF tools through the lens of teacher feedback literacy, producing summary matrices that reveal relative strengths and weaknesses in the literature and the relevance of the feedback literacy framework. We conclude, firstly, that when used effectively, open AF tools exercise a range of teacher feedback competencies; the paper thus offers a detailed account of the nature of teachers' feedback literacy practices in this context. Secondly, the analysis reveals gaps in the literature, signalling opportunities for future work. Thirdly, we propose several examples of automated feedback literacy, that is, distinctive teacher competencies linked to the skilled use of open AF tools.
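    The paper analyses practice rather than prescribing an implementation, but the notion of an "open" AF tool can be illustrated with a hedged sketch: feedback rules whose triggers, messages and on/off state are visible to and editable by the teacher rather than hidden inside the tool. Everything here (FeedbackRule, run_feedback, the sample rule) is hypothetical.

        from dataclasses import dataclass
        from typing import Callable, List

        # A hypothetical "open" automated-feedback rule: the trigger, the message and
        # the enabled switch are exposed to the teacher, not fixed by the vendor.
        @dataclass
        class FeedbackRule:
            name: str
            trigger: Callable[[str], bool]   # inspects a student submission
            message: str
            enabled: bool = True             # teacher-controlled switch

        rules = [
            FeedbackRule(
                name="missing_conclusion",
                trigger=lambda text: "in conclusion" not in text.lower(),
                message="Consider adding a concluding paragraph that states your position.",
            ),
        ]

        def run_feedback(submission: str, rulebook: List[FeedbackRule]) -> List[str]:
            """Apply only the rules the teacher has left enabled."""
            return [r.message for r in rulebook if r.enabled and r.trigger(submission)]

        print(run_feedback("The case law is unsettled.", rules))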

    Collocated Collaboration Analytics: Principles and Dilemmas for Mining Multimodal Interaction Data

    © 2019, Copyright © 2017 Taylor & Francis Group, LLC. Learning to collaborate effectively requires practice, awareness of group dynamics, and reflection; often it benefits from coaching by an expert facilitator. However, in physical spaces it is not always easy to provide teams with evidence to support collaboration. Emerging technology provides a promising opportunity to make collocated collaboration visible by harnessing data about interactions and then mining and visualizing it. These collocated collaboration analytics can help researchers, designers, and users to understand the complexity of collaboration and to find ways to support it. This article introduces and motivates a set of principles for mining collocated collaboration data and draws attention to trade-offs that may need to be negotiated en route. We integrate Data Science principles and techniques with advances in interactive surface devices and sensing technologies. We draw on a 7-year research program that has involved the analysis of six group situations in collocated settings with more than 500 users and a variety of surface technologies, tasks, grouping structures, and domains. The contribution of the article includes the key insights and themes that we have identified and summarized in a set of principles and dilemmas that can inform the design of future collocated collaboration analytics innovations.
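    The article sets out principles rather than a specific pipeline, so the following is only an assumed example of the kind of mining it refers to: computing coarse participation indicators from a timestamped log of multimodal events. The event format, member identifiers and function names are invented for illustration.

        from collections import defaultdict

        # Hypothetical multimodal event log: (timestamp_seconds, member_id, modality).
        events = [
            (0.0, "A", "speech"), (2.5, "B", "touch"), (3.1, "A", "touch"),
            (5.0, "C", "speech"), (6.2, "B", "speech"), (7.8, "A", "speech"),
        ]

        def participation_summary(log):
            """Count events per member and per modality, a coarse indicator of
            how evenly the group is participating."""
            per_member = defaultdict(lambda: defaultdict(int))
            for _, member, modality in log:
                per_member[member][modality] += 1
            return {member: dict(counts) for member, counts in per_member.items()}

        def participation_share(log):
            """Share of all events contributed by each member (1/n would be perfectly even)."""
            totals = defaultdict(int)
            for _, member, _ in log:
                totals[member] += 1
            n = sum(totals.values())
            return {member: count / n for member, count in totals.items()}

        print(participation_summary(events))
        print(participation_share(events))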

    DBCollab: Automated feedback for face-to-face group database design

    © 2017 Asia-Pacific Society for Computers in Education. All rights reserved. Developing effective teamwork and collaboration skills is regarded as a key graduate attribute for employability. As a result, higher education institutions are striving to help students foster these skills through authentic learning scenarios. Although face-to-face (f2f) group tasks are common in most classrooms, it is challenging to collect evidence about the group processes. Consequently, to date, it has been difficult to assess group tasks in ways other than through teachers' direct observations and students' self-reports, or by measuring the quality of the final product. However, there are other critical aspects of group work that students need feedback on, such as interaction dynamics and collaboration processes. This paper explores the potential of using interactive surfaces and sensors to track key indicators of group work and to provide automated feedback about epistemic and social aspects. We conducted a pilot study in an authentic classroom, in the context of database design. The contributions of this paper are: 1) the operationalisation of the DBCollab tool as a means for supporting group database design and collecting multimodal traces of the activity using interactive surfaces and sensors; and 2) empirical evidence that points to the potential of presenting these traces to group members in order to provoke immediate and post-hoc productive reflection about their activity.
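    The paper's actual operationalisation is not reproduced here; as a hypothetical illustration of separating epistemic from social indicators, the sketch below aggregates traces from the interactive surface (edits to the shared database design) separately from detected speech, per group member. The Trace structure, field names and numbers are assumptions.

        from dataclasses import dataclass

        @dataclass
        class Trace:
            member: str
            source: str    # "surface" (schema edits) or "audio" (speech detected)
            seconds: float

        # Hypothetical traces captured during a group database-design task.
        traces = [
            Trace("A", "surface", 40), Trace("A", "audio", 120),
            Trace("B", "surface", 310), Trace("B", "audio", 15),
            Trace("C", "surface", 5), Trace("C", "audio", 200),
        ]

        def reflection_summary(log):
            """Aggregate epistemic (surface) and social (audio) activity per member,
            the kind of mirror a tool might show the group for post-hoc reflection."""
            summary = {}
            for t in log:
                member = summary.setdefault(t.member, {"epistemic_s": 0.0, "social_s": 0.0})
                key = "epistemic_s" if t.source == "surface" else "social_s"
                member[key] += t.seconds
            return summary

        for member, stats in reflection_summary(traces).items():
            print(member, stats)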
    • …